University of Texas at Austin

Upcoming Event: Oden Institute Seminar

A Path Toward Computational Predictive Capability

Bill Oberkampf, Sandia National Laboratories (retired), Consulting Engineer

3:30 – 5:00 PM
Tuesday, October 21, 2025

POB 6.304 and Zoom

Abstract

Simulation is becoming the primary tool in predicting the performance, reliability, and safety of engineered systems. From a simulation-informed decision-making perspective or a regulatory perspective, the central question is: what is the evidence for simulation credibility? Many contend that higher-fidelity physics modeling, combined with faster computers, is the path forward for improved simulation credibility. However, considering the rate of increase in physics complexity as spatial scales decrease, the more demanding computational requirements, and the more detailed model input data required for finer spatial scales, this argument for improved simulation credibility is questionable. More tellingly, the last five decades of astounding growth in computing speed have not delivered reliable simulation credibility. This lecture argues that improved simulation credibility should be built on code verification, solution verification, model validation, and improved estimation of predictive uncertainty. Predictive uncertainty is the emerging field that attempts to capture all sources of uncertainty in order to foretell the response of a system under conditions where no experimental data are available. The categories of uncertainty are model input uncertainty (which includes uncertainties in the system itself, as well as in the environments and scenarios to which the system could be exposed), model form uncertainty (due to approximations and assumptions in the formulation of the model), and numerical solution errors of various types. I argue that the estimation of total uncertainty is the most informative and prudent path forward, particularly for high-consequence systems and systems in abnormal and hostile environments. To achieve a comprehensive estimation of total uncertainty, a distinction must be made between uncertainties that are random (aleatory uncertainties) and those that are due to lack of knowledge (epistemic uncertainties). Traditional probability, including Bayesian estimation, is severely constrained in characterizing epistemic uncertainties. Imprecise probability approaches can explicitly include both aleatory and epistemic uncertainties, as well as contradictory and duplicative information, thereby providing a forthright description of total predictive uncertainty for a decision maker or regulatory authority.
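To make the aleatory/epistemic distinction concrete, the sketch below shows one common approach: a nested ("double-loop") Monte Carlo that propagates an interval-valued epistemic parameter around an aleatory input distribution, producing a probability box (p-box). This is not material from the talk; the model, parameter names, and numerical bounds are hypothetical, and the scheme is only one simple instance of the imprecise-probability ideas the abstract refers to.

```python
# Minimal sketch (assumptions, not from the talk): separate aleatory and epistemic
# uncertainty with a double-loop Monte Carlo and report the resulting p-box bounds.
import numpy as np

rng = np.random.default_rng(0)

def model(x, theta):
    # Hypothetical system response; x is an aleatory input, theta an epistemic parameter.
    return theta * x**2 + x

n_epistemic = 50      # outer loop: values spanning an epistemic interval (lack of knowledge)
n_aleatory = 2000     # inner loop: random samples of the aleatory input

theta_interval = (0.8, 1.2)   # epistemic: only bounds are known, no distribution assumed
cdfs = []
for theta in np.linspace(*theta_interval, n_epistemic):
    x = rng.normal(loc=1.0, scale=0.1, size=n_aleatory)   # aleatory: known distribution
    cdfs.append(np.sort(model(x, theta)))                  # empirical CDF of the response

# Envelope of the family of response CDFs: each curve reflects aleatory variability,
# and the spread between the left and right bounds at a given probability level is
# the epistemic contribution. Together they form the p-box handed to a decision maker.
cdfs = np.array(cdfs)
left_bound = cdfs.min(axis=0)    # optimistic bound on the response CDF
right_bound = cdfs.max(axis=0)   # conservative bound on the response CDF
k = int(0.9 * n_aleatory)
print(f"90th-percentile response lies in [{left_bound[k]:.3f}, {right_bound[k]:.3f}]")
```

The key design point in this kind of scheme is that the epistemic parameter is never averaged over a presumed distribution; its interval is carried through to the answer, so the decision maker sees bounds rather than a single, possibly overconfident, probability.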

Biography

Dr. William Oberkampf has 55 years of experience in research and development in fluid dynamics, heat transfer, flight dynamics, solid mechanics, and structural dynamics. Over the last 30 years, Dr. Oberkampf has emphasized research and development in methodologies and procedures for verification, validation, and uncertainty quantification for a wide variety of applications. He has written over 195 journal articles, book chapters, conference papers, and technical reports. His second book, co-authored with Christopher Roy, "Verification, Validation and Uncertainty Quantification in Scientific Computing," was recently published by Cambridge University Press. He has taught 76 short courses, primarily in the fields of verification, validation, and uncertainty quantification. Dr. Oberkampf received his PhD in Aerospace Engineering in 1970 from the University of Notre Dame. He served on the faculty of the Mechanical Engineering Department at the University of Texas at Austin and held both staff and management positions for 29 years at Sandia National Laboratories. Since then, he has been a consultant to organizations in the U.S. and Europe. He is a Fellow of the American Institute of Aeronautics and Astronautics and a Fellow of NAFEMS.


Event information

Date: Tuesday, October 21, 2025
Time: 3:30 – 5:00 PM
Location: POB 6.304 and Zoom
Hosted by: Karen E. Willcox